

 kernel ridge regression






1 Model, contributions and related works

Random features model as a 2-layer neural network. Given n observations (x_1, y_1), ..., (x_n, y_n) with x_i ∈ R^p and y_i ∈ R for each i = 1, ..., n, the object of study of this paper is the estimate α̂ = argmin
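The objective defining α̂ is truncated above. As a hedged illustration only, the following sketches the standard random-features ridge estimator that this setup usually refers to: a fixed random first layer W and a trained second layer α̂ minimizing a ridge-penalized least-squares objective. All dimensions, the ReLU activation, and the penalty λ are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: n observations x_i in R^p with scalar targets y_i (assumed).
n, p, d = 200, 10, 500            # d = number of random features (first-layer width)
X = rng.standard_normal((n, p))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

# Random features model viewed as a 2-layer NN: the first layer W is random
# and fixed; only the second-layer weights alpha are trained.
W = rng.standard_normal((d, p)) / np.sqrt(p)
Z = np.maximum(X @ W.T, 0.0)      # ReLU features, shape (n, d)

# Ridge estimate: alpha_hat = argmin_alpha ||y - Z alpha||^2 + lam * ||alpha||^2
lam = 1e-2
alpha_hat = np.linalg.solve(Z.T @ Z + lam * np.eye(d), Z.T @ y)

def predict(X_new):
    """Evaluate the fitted random-features model on new inputs."""
    return np.maximum(X_new @ W.T, 0.0) @ alpha_hat
```

With d > n the model is over-parameterized, which is the proportional regime (n, p, d growing together) studied in this line of work.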

Neural Information Processing Systems

We establish Central Limit Theorems (CLTs) for the derivatives of the 2-layer NN models in (2) when n, p, d → +∞ in the proportional asymptotic regime (6). A weighted average of the gradients of the trained NN, up to an explicit additive correction, is proved to be asymptotically normal, and the variance of the limit can be estimated explicitly.


Target alignment in truncated kernel ridge regression

Neural Information Processing Systems

We show that for polynomial alignment, there is an over-aligned regime in which TKRR can achieve a faster rate than what is achievable by full KRR.
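The abstract does not spell out the TKRR construction; a common version keeps only the top r eigenpairs of the kernel matrix when solving the ridge system. The sketch below illustrates that construction under this assumption, with the kernel, the truncation level r, and all data purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
X = rng.standard_normal((n, 3))
y = X[:, 0] + 0.1 * rng.standard_normal(n)

# Wide RBF kernel matrix on the training points (bandwidth assumed).
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-0.1 * sq)

lam, r = 1e-3, 20                         # ridge parameter and truncation level (assumed)
evals, evecs = np.linalg.eigh(K)
# np.linalg.eigh returns eigenvalues in ascending order: keep the r largest pairs.
evals_r, evecs_r = evals[-r:], evecs[:, -r:]

# TKRR: solve the ridge system only on the top-r eigenspace of K,
# instead of inverting the full (K + lam * I) as in full KRR.
c_hat = evecs_r @ ((evecs_r.T @ y) / (evals_r + lam))
fitted = K @ c_hat
```

Full KRR corresponds to keeping all n eigenpairs; truncation trades a controlled bias for a cheaper solve, and the "over-aligned" regime is where that trade-off improves the rate.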




Networks

Neural Information Processing Systems

Despite being powerful and well understood, kernel ridge regression suffers from costly computation when dealing with large datasets, since a direct implementation of Eq. (1) generally requires O(n^3) running time.
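The O(n^3) cost comes from solving an n × n linear system. A minimal sketch of that direct solve (the specific kernel, bandwidth, and data are assumptions for illustration, not the paper's Eq. (1)):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300
X = rng.standard_normal((n, 5))
y = np.cos(X[:, 1]) + 0.1 * rng.standard_normal(n)

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian RBF kernel between row sets A and B (gamma assumed)."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# Direct KRR: building K costs O(n^2) memory and solving the n x n system
# (K + lam * I) alpha = y costs O(n^3) time -- the bottleneck noted above.
K = rbf_kernel(X, X)
lam = 1e-2
alpha = np.linalg.solve(K + lam * np.eye(n), y)

def predict(X_new):
    """Kernel ridge prediction at new points."""
    return rbf_kernel(X_new, X) @ alpha
```

This cubic scaling is exactly what approximations such as random features or truncated spectral methods, discussed in the entries above, are designed to avoid.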